Improving Combination Methods of Neural Classifiers Using NCL

Authors

  • Arash Iranzad
  • Saeed Masoudnia
  • Fatemeh Cheraghchi
  • Abbas Nowzari-Dalini
  • Reza Ebrahimpour
Abstract

In this paper, the effect of the diversity induced by Negative Correlation Learning (NCL) on the combination of neural classifiers is investigated, and an efficient way to improve combining performance is presented. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are selected as base ensemble learners, and their NCL versions are compared against them in our experiments. Utilizing NCL to diversify the base classifiers leads to significantly better results with all of the employed combining methods. Experimental results on five datasets from the UCI Machine Learning Repository indicate that, by employing NCL, the performance of the ensemble structure is more favorable than that of an ensemble using independently trained base classifiers.
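For readers unfamiliar with NCL, the following minimal sketch illustrates the negative-correlation penalty that diversifies the base learners. It is not the paper's implementation: it trains a small ensemble of linear regressors by plain gradient descent, and the names (n_models, lambda_, lr) and the linear-model choice are illustrative assumptions only.

import numpy as np

# Minimal sketch of Negative Correlation Learning (NCL) for an ensemble of
# linear models trained jointly by gradient descent (illustrative assumptions,
# not the paper's neural-classifier setup).
rng = np.random.default_rng(0)
n_samples, n_features, n_models = 200, 5, 3
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
d = X @ true_w + 0.1 * rng.normal(size=n_samples)     # targets

W = 0.01 * rng.normal(size=(n_models, n_features))    # one weight vector per ensemble member
lambda_, lr = 0.5, 0.01                                # NCL penalty strength, step size

for epoch in range(500):
    F = X @ W.T                                        # (n_samples, n_models) individual outputs
    F_bar = F.mean(axis=1, keepdims=True)              # ensemble (average) output
    # NCL error of member i: 1/2*(F_i - d)^2 + lambda * (F_i - F_bar) * sum_{j!=i}(F_j - F_bar).
    # With the usual simplification of treating F_bar as constant, its gradient
    # w.r.t. F_i is (F_i - d) - lambda * (F_i - F_bar): each member is pushed toward
    # the target but away from the ensemble mean, which is what creates the diversity.
    dF = (F - d[:, None]) - lambda_ * (F - F_bar)
    W -= lr * (dF.T @ X) / n_samples                   # simultaneous update of all members

print("ensemble MSE:", np.mean(((X @ W.T).mean(axis=1) - d) ** 2))

With lambda_ = 0 the members are trained independently; increasing it toward 1 trades individual accuracy for negatively correlated errors, which is the regime the paper exploits before applying the combining rules.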


Similar articles

Using Negative Correlation Learning to Improve the Performance of Combining Neural Networks

This paper investigates the effect of diversity caused by Negative Correlation Learning (NCL) in the combination of neural classifiers and presents an efficient way to improve combining performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL for diversifying the ba...


A New Approach to Combine Classifiers Trained by NCL

In this paper we propose a new way to combine classifiers trained and diversified by NCL that leads to superior results compared with the averaging and DTs methods alone. In the proposed method, after training the classifiers by NCL, DTs and averaging are employed independently to combine them, and then the outputs are combined again by averaging. For the second-level combination, support vectors are scal...


Improving Identification of Difficult Small Classes by Balancing Class Distribution

We studied three methods to improve identification of difficult small classes by balancing imbalanced class distribution with data reduction. The new method, neighborhood cleaning rule (NCL), outperformed simple random and one-sided selection methods in experiments with ten data sets. All reduction methods improved identification of small classes (20-30%), but the differences were insignificant...


Machine Learning Methods to Analyze Arabidopsis Thaliana Plant Root Growth

One of the challenging problems in biology is to classify plants based on their reaction to genetic mutation. Arabidopsis Thaliana is a particularly interesting plant because its genetic structure has some similarities with that of human beings. Biologists classify this plant into mutated and non-mutated (wild) types. Phenotypic analysis of these types is a time-consuming and c...


Increasing Classification Performance by Aggregating the Effective Features of Different Neural Network Combination Methods

Both theoretical and experimental studies have shown that combining accurate Neural Networks (NN) in the ensemble with negative error correlation greatly improves their generalization abilities. Negative Correlation Learning (NCL) and Mixture of Experts (ME), two popular combining methods, each employ different special error functions for the simultaneous training of NN experts to produce negat...




Publication year: 2012